Greedy Aggregation for Vector Quantization

Authors

  • Yair Koren
  • Irad Yavneh
Abstract

Vector quantization is a classical problem that appears in many fields. Unfortunately, the quantization problem is generally non-convex and therefore admits many local minima. The main challenge is finding an initial approximation that is close to a “good” local minimum; once such an approximation is found, the Lloyd–Max method can be used to reach a local minimum near it. In recent years, considerable progress has been made in reducing the computational cost of quantization algorithms, whereas the task of finding better initial approximations has received somewhat less attention. We present a novel greedy algorithm for the vector quantization problem. The algorithm begins by allocating a large number of representation levels throughout the input domain and then iteratively aggregates pairs of representation levels, replacing each selected pair by a single level. The pair chosen for aggregation is the one yielding the minimal increase in distortion among all such pairs. Our method achieves better solutions than other contemporary methods such as LBG. In addition, its computational complexity is low, typically quantizing data in time equivalent to just a few Lloyd–Max iterations. When quantizing signals with sparse and patchy histograms, e.g., cartoon images, the improvement in distortion relative to LBG can be arbitrarily large.
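The merging rule described above lends itself to a compact sketch. The following Python snippet is a minimal illustration of the aggregation idea, not the paper's actual algorithm: it assumes squared-error distortion (so the exact cost of merging two weighted levels is the standard Ward increment), starts from one level per input point, and naively scans all pairs; the function name and interface are invented for illustration. A practical implementation would restrict candidate pairs (e.g., to neighboring levels) or keep merge costs in a priority queue rather than rescanning.

```python
import numpy as np

def greedy_aggregate(points, k):
    """Greedily merge representation levels down to k levels.

    Repeatedly merges the pair of levels whose merger causes the
    minimal increase in total squared-error distortion. The naive
    all-pairs scan is O(K^2) per merge and is for illustration only.
    """
    # Each level is a (centroid, weight) pair; initially one per point.
    levels = [(np.asarray(p, dtype=float), 1.0) for p in points]
    while len(levels) > k:
        best = None
        for i in range(len(levels)):
            ci, wi = levels[i]
            for j in range(i + 1, len(levels)):
                cj, wj = levels[j]
                # Exact increase in squared distortion when merging two
                # weighted levels (the Ward increment).
                delta = wi * wj / (wi + wj) * float(np.sum((ci - cj) ** 2))
                if best is None or delta < best[0]:
                    best = (delta, i, j)
        _, i, j = best
        (ci, wi), (cj, wj) = levels[i], levels[j]
        merged = ((wi * ci + wj * cj) / (wi + wj), wi + wj)
        levels = [lv for t, lv in enumerate(levels) if t not in (i, j)]
        levels.append(merged)
    return np.array([c for c, _ in levels])

# e.g., an 8-level codebook for 100 random 2-D points:
codebook = greedy_aggregate(np.random.rand(100, 2), k=8)
```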


Similar Papers

Greedy vector quantization

We investigate the greedy version of the L^p-optimal vector quantization problem for an R^d-valued random vector X ∈ L^p. We show the existence of a sequence (a_N)_{N≥1} such that a_N minimizes a ↦ ‖ min_{1≤i≤N−1} |X − a_i| ∧ |X − a| ‖_{L^p} (the L^p-mean quantization error at level N induced by (a_1, …, a_{N−1}, a)). We show that this sequence produces L^p-rate optimal N-tuples a^(N) = (a_1, …, a_N) (i.e. the L^p-mean q...
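For concreteness, here is a hedged sketch of the level-by-level greedy construction that abstract describes, restricted to the scalar case with p = 2: the quantization error is estimated on a finite sample and each new level is chosen from a finite candidate grid. The function name, the sample-based estimate, and the candidate grid are illustrative assumptions, not details from the paper.

```python
import numpy as np

def greedy_quantizer(samples, candidates, N):
    """Build levels a_1, ..., a_N one at a time: each new level is the
    candidate minimizing the empirical L^2 quantization error given
    the levels already chosen (scalar case, p = 2)."""
    dist = np.full(len(samples), np.inf)  # distance to nearest chosen level
    levels = []
    for _ in range(N):
        best_c, best_err = None, np.inf
        for c in candidates:
            err = np.mean(np.minimum(dist, np.abs(samples - c)) ** 2)
            if err < best_err:
                best_c, best_err = c, err
        levels.append(best_c)
        dist = np.minimum(dist, np.abs(samples - best_c))
    return np.array(levels)

# e.g., 16 greedy levels for a standard normal sample:
levels = greedy_quantizer(np.random.randn(10000), np.linspace(-4, 4, 201), N=16)
```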


Variable-length constrained-storage tree-structured vector quantization

Constrained-storage vector quantization (CSVQ), introduced by Chan and Gersho (1990, 1991), allows for the stagewise design of balanced tree-structured residual vector quantization codebooks with low encoding and storage complexities. On the other hand, it has been established by Makhoul et al. (1985), Riskin et al. (1991), and by Mahesh et al. (see IEEE Trans. Inform. Theory, vol. 41, pp. 917-30, ...


Greedy sparse decompositions: a comparative study

The purpose of this article is to present a comparative study of sparse greedy algorithms that were separately introduced in the speech and audio research communities. In particular, it is shown that the Matching Pursuit (MP) family of algorithms (MP, OMP, and OOMP) is equivalent to multi-stage gain-shape vector quantization algorithms previously designed for speech signal coding. These algorithms ...
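To make the stated equivalence concrete, below is a minimal sketch of plain Matching Pursuit under the usual assumption of unit-norm dictionary columns; in the gain-shape VQ reading, the selected atom index plays the role of the "shape" codeword and its coefficient the "gain" of each stage. The interface is illustrative, not taken from the article.

```python
import numpy as np

def matching_pursuit(x, D, n_atoms):
    """Plain MP: repeatedly pick the dictionary atom most correlated
    with the residual and subtract its contribution. Atom index =
    'shape', coefficient = 'gain' of each gain-shape VQ stage."""
    residual = np.array(x, dtype=float)
    idxs, gains = [], []
    for _ in range(n_atoms):
        corr = D.T @ residual          # correlations with unit-norm atoms
        k = int(np.argmax(np.abs(corr)))
        idxs.append(k)
        gains.append(corr[k])
        residual = residual - corr[k] * D[:, k]
    return idxs, gains, residual

# e.g., a random dictionary with unit-norm columns:
D = np.random.randn(64, 256)
D /= np.linalg.norm(D, axis=0)
idxs, gains, r = matching_pursuit(np.random.randn(64), D, n_atoms=5)
```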


Soft learning vector quantization and clustering algorithms based on ordered weighted aggregation operators

This paper presents the development of ordered weighted learning vector quantization (LVQ) and clustering algorithms and investigates their properties. These algorithms are developed by using gradient descent to minimize reformulation functions based on aggregation operators. An axiomatic approach provides conditions for selecting aggregation operators that lead to admissible reformulation functi...


Self-organizing maps for the design of multiple description vector quantizers

Multiple description coding is an appealing tool to guarantee graceful signal degradation in the presence of unreliable channels. While the principles of multiple description scalar quantization are well understood and solid guidelines exist for designing effective systems, the same does not hold for vector quantization, especially at low bit rates, where burdensome and unsatisfactory design techniq...



Publication date: 2005